shear force


Physics-Informed Extreme Learning Machine (PIELM) for Tunnelling-Induced Soil-Pile Interactions

Guo, Fu-Chen, Zhuang, Pei-Zhi, Ren, Fei, Yue, Hong-Ya, Yang, He

arXiv.org Artificial Intelligence

Physics-informed machine learning is a promising approach in geotechnical engineering that combines data-driven learning with physical laws. This study proposes a physics-informed extreme learning machine (PIELM) framework for analyzing tunnelling-induced soil-pile interactions. The pile foundation is modeled as an Euler-Bernoulli beam, and the surrounding soil is modeled as a Pasternak foundation. The soil-pile interaction is formulated as a fourth-order ordinary differential equation (ODE) that constitutes the physics-informed component, while measured data are incorporated into PIELM as the data-driven component. Combining physics and data yields a loss vector for the extreme learning machine (ELM) network, which is trained within 1 second by the least-squares method. After validating the PIELM approach against the boundary element method (BEM) and the finite difference method (FDM), parametric studies are carried out to examine the effects of the ELM network architecture and of the locations and number of monitoring points on the performance of PIELM. The results indicate that monitoring data should be collected at positions where the gradients of pile deflection are significant, such as at the pile tip/top and near the tunnelling zone. Two application examples highlight the critical role of the combined physics-informed and data-driven approach for tunnelling-induced soil-pile interactions. The proposed approach shows great potential for real-time monitoring and safety assessment of pile foundations, and can benefit intelligent early-warning systems in geotechnical engineering.
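The closed-form training step behind ELM can be sketched in a few lines. This is an illustrative fit of a generic 1-D profile, not the authors' pile model, and it omits the fourth-order ODE residual term that PIELM adds to the loss; all weights, ranges, and the target function are assumptions.

```python
import numpy as np

# Minimal extreme learning machine (ELM) sketch: input weights and biases
# are random and fixed; only the output weights are trained, in closed
# form, by least squares. PIELM augments the loss with the residual of a
# fourth-order beam-on-foundation ODE, which is omitted here.
rng = np.random.default_rng(0)

def elm_fit(x, y, n_hidden=50):
    W = rng.uniform(-4.0, 4.0, size=(1, n_hidden))  # random input weights (fixed)
    b = rng.uniform(-4.0, 4.0, size=n_hidden)       # random biases (fixed)
    H = np.tanh(x[:, None] @ W + b)                 # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)    # least-squares "training"
    return W, b, beta

def elm_predict(x, W, b, beta):
    return np.tanh(x[:, None] @ W + b) @ beta

x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x)                         # stand-in deflection profile
W, b, beta = elm_fit(x, y)
err = np.max(np.abs(elm_predict(x, W, b, beta) - y))
```

The entire "training" is a single linear solve, which is why ELM-style networks can be fitted in well under a second.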


Shear-based Grasp Control for Multi-fingered Underactuated Tactile Robotic Hands

Ford, Christopher J., Li, Haoran, Catalano, Manuel G., Bianchi, Matteo, Psomopoulou, Efi, Lepora, Nathan F.

arXiv.org Artificial Intelligence

This paper presents a shear-based control scheme for grasping and manipulating delicate objects with a Pisa/IIT anthropomorphic SoftHand equipped with soft biomimetic tactile sensors on all five fingertips. These `microTac' tactile sensors are miniature versions of the TacTip vision-based tactile sensor, and can extract precise contact geometry and force information at each fingertip for use as feedback into a controller to modulate the grasp while a held object is manipulated. Using a parallel processing pipeline, we asynchronously capture tactile images and predict contact pose and force from multiple tactile sensors. Consistent pose and force models across all sensors are developed using supervised deep learning with transfer learning techniques. We then develop a grasp control framework that uses contact force feedback from all fingertip sensors simultaneously, allowing the hand to safely handle delicate objects even under external disturbances. This control framework is applied to several grasp-manipulation experiments: first, retaining a flexible cup in a grasp without crushing it under changes in object weight; second, a pouring task where the center of mass of the cup changes dynamically; and third, a tactile-driven leader-follower task where a human guides a held object. These manipulation tasks demonstrate more human-like dexterity with underactuated robotic hands by using fast reflexive control from tactile sensing.
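The force-feedback idea above can be caricatured as a proportional control loop over the fingertip contacts; the target force, gain, and toy hand/object model below are illustrative assumptions, not values or code from the paper.

```python
# Hedged sketch of a tactile grasp-force loop: each control tick compares
# the lightest fingertip contact force against a target grip force and
# nudges the hand-closure command accordingly. All numbers are invented.
def grasp_step(command, measured_forces, target=0.5, gain=0.05):
    error = target - min(measured_forces)  # the weakest contact governs safety
    return command + gain * error

command = 0.0
forces = [0.10, 0.20, 0.15, 0.30, 0.25]   # five fingertip force readings (N)
for _ in range(200):
    command = grasp_step(command, forces)
    # Toy plant: each contact force relaxes halfway toward the command.
    forces = [f + 0.5 * (command - f) for f in forces]
```

Regulating on the weakest contact is one plausible reading of "safely handle delicate objects": the grip tightens only as far as needed to keep every fingertip engaged.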


Bio-Skin: A Cost-Effective Thermostatic Tactile Sensor with Multi-Modal Force and Temperature Detection

Guo, Haoran, Wang, Haoyang, Li, Zhengxiong, Tao, Lingfeng

arXiv.org Artificial Intelligence

Tactile sensors can significantly enhance the perception of humanoid robotic systems by providing contact information that facilitates human-like interactions. However, existing commercial tactile sensors focus on improving the resolution and sensitivity of single-modal detection with high-cost components and densely integrated designs, incurring complex manufacturing processes and unaffordable prices. In this work, we present Bio-Skin, a cost-effective multi-modal tactile sensor that uses single-axis Hall-effect sensors for planar normal-force measurement and bar-shaped piezoresistors for 2D shear-force measurement. A thermistor coupled with a heating wire is integrated into a silicone body to achieve temperature sensation and a thermostatic function analogous to that of human skin. We also present a cross-reference framework that validates the two force-sensing modalities against each other, improving sensing fidelity in a complex electromagnetic environment. Bio-Skin has a multi-layer design in which each layer is manufactured sequentially and then integrated, offering a fast production pathway. After calibration, Bio-Skin demonstrates performance metrics (including signal-to-range ratio, sampling rate, and measurement range) comparable to those of current commercial products, at one-tenth of the cost. The sensor's real-world performance is evaluated with an Allegro hand in object-grasping tasks, and its temperature-regulation functionality is assessed in a material-detection task.
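The thermostatic function of a thermistor paired with a heating wire is, at its simplest, a hysteresis (bang-bang) controller. The setpoint, band, and toy thermal model below are assumptions for illustration, not parameters from the paper.

```python
# Illustrative bang-bang thermostat: the heater switches on below a
# setpoint band and off above it, as read through a thermistor. The
# 33 C skin-like setpoint and the heating/cooling rates are invented.
def thermostat_step(temp, heater_on, setpoint=33.0, band=0.5):
    if temp < setpoint - band:
        return True     # too cold: heat
    if temp > setpoint + band:
        return False    # too warm: idle
    return heater_on    # inside the band: keep state (hysteresis)

temp, heater = 20.0, False
history = []
for _ in range(500):
    heater = thermostat_step(temp, heater)
    temp += 0.2 if heater else -0.05   # toy heating / ambient-loss model
    history.append(temp)
```

The hysteresis band prevents rapid on/off chatter around the setpoint, which matters when the actuator is a resistive wire embedded in silicone.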


Structured Pneumatic Fingerpads for Actively Tunable Grip Friction

Allison, Katherine, Kelly, Jonathan, Hatton, Benjamin

arXiv.org Artificial Intelligence

Grip surfaces with tunable friction can actively modify contact conditions, enabling transitions between higher- and lower-friction states for grasp adjustment. Friction can be increased to grip securely and then decreased to gently release (e.g., for handovers) or manipulate in-hand. Recent friction-tuning surface designs using soft pneumatic chambers show good control over grip friction; however, most require complex fabrication processes and/or custom gripper hardware. We present a practical structured fingerpad design for friction tuning that uses less than $1 USD of materials, takes only seconds to repair, and is easily adapted to existing grippers. Our design uses surface morphology changes to tune friction. The fingerpad is actuated by pressurizing its internal chambers, thereby deflecting its flexible grip surface out from or into these chambers. We characterize the friction-tuning capabilities of our design by measuring the shear force required to pull an object from a gripper equipped with two independently actuated fingerpads. Our results show that varying actuation pressure and timing changes the magnitude of friction forces on a gripped object by up to a factor of 2.8. We demonstrate additional features including macro-scale interlocking behaviour and pressure-based object detection.
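A back-of-envelope Coulomb-friction reading of the factor-2.8 result (our simplification, not the authors' contact model): the pull-out shear force scales as F = μN, so tuning the effective friction coefficient μ at a fixed grip normal force N rescales the retention force by the same factor.

```python
# Coulomb-friction toy model: pull-out force F = mu * N. The baseline
# mu and grip force N below are arbitrary illustrative values.
def pull_force(mu, normal_force):
    return mu * normal_force

low = pull_force(0.5, 4.0)          # low-friction surface state
high = pull_force(0.5 * 2.8, 4.0)   # actuated high-friction state
```

The practical consequence is that the gripper can hold an object at a much lower squeeze force in the high-friction state, then drop μ to release gently without reopening the fingers.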


Digitizing Touch with an Artificial Multimodal Fingertip

Lambeta, Mike, Wu, Tingfan, Sengul, Ali, Most, Victoria Rose, Black, Nolan, Sawyer, Kevin, Mercado, Romeo, Qi, Haozhi, Sohn, Alexander, Taylor, Byron, Tydingco, Norb, Kammerer, Gregg, Stroud, Dave, Khatha, Jake, Jenkins, Kurt, Most, Kyle, Stein, Neal, Chavira, Ricardo, Craven-Bartle, Thomas, Sanchez, Eric, Ding, Yitian, Malik, Jitendra, Calandra, Roberto

arXiv.org Artificial Intelligence

Touch is a crucial sensing modality that provides rich information about object properties and interactions with the physical environment. Humans and robots both benefit from using touch to perceive and interact with the surrounding environment (Johansson and Flanagan, 2009; Li et al., 2020; Calandra et al., 2017). However, no existing systems provide rich, multi-modal digital touch-sensing capabilities through a hemispherical compliant embodiment. Here, we describe several conceptual and technological innovations to improve the digitization of touch. These advances are embodied in an artificial finger-shaped sensor with advanced sensing capabilities. Significantly, this fingertip contains high-resolution sensors (~8.3 million taxels) that respond to omnidirectional touch, capture multi-modal signals, and use on-device artificial intelligence to process the data in real time. Evaluations show that the artificial fingertip can resolve spatial features as small as 7 µm, sense normal and shear forces with a resolution of 1.01 mN and 1.27 mN, respectively, perceive vibrations up to 10 kHz, sense heat, and even sense odor. Furthermore, it embeds an on-device AI neural network accelerator that acts as a peripheral nervous system on a robot and mimics the reflex arc found in humans. These results demonstrate the possibility of digitizing touch with superhuman performance. The implications are profound, and we anticipate potential applications in robotics (industrial, medical, agricultural, and consumer-level), virtual reality and telepresence, prosthetics, and e-commerce. Toward digitizing touch at scale, we open-source a modular platform to facilitate future research on the nature of touch.


Learning Force Distribution Estimation for the GelSight Mini Optical Tactile Sensor Based on Finite Element Analysis

Helmut, Erik, Dziarski, Luca, Funk, Niklas, Belousov, Boris, Peters, Jan

arXiv.org Artificial Intelligence

Contact-rich manipulation remains a major challenge in robotics. Optical tactile sensors like GelSight Mini offer a low-cost solution for contact sensing by capturing soft-body deformations of the silicone gel. However, accurately inferring shear and normal force distributions from these gel deformations has yet to be fully addressed. In this work, we propose a machine learning approach using a U-net architecture to predict force distributions directly from the sensor's raw images. Our model, trained on force distributions inferred from Finite Element Analysis (FEA), demonstrates promising accuracy in predicting normal and shear force distributions. It also shows potential for generalization across sensors of the same type and for enabling real-time application. The codebase, dataset and models are open-sourced and available at https://feats-ai.github.io .


Deep Domain Adaptation Regression for Force Calibration of Optical Tactile Sensors

Chen, Zhuo, Ou, Ni, Jiang, Jiaqi, Luo, Shan

arXiv.org Artificial Intelligence

Optical tactile sensors provide robots with rich force information for grasping in unstructured environments. Fast and accurate calibration of three-dimensional contact forces is significant both for new sensors and for existing tactile sensors that may have suffered damage or aging. However, the conventional neural-network-based force calibration method requires a large volume of force-labeled tactile images to minimize force prediction errors, demanding accurate force/torque measurement tools and a time-consuming data collection process. To address this challenge, we propose a novel deep domain-adaptation force calibration method, designed to transfer the force prediction ability from a calibrated optical tactile sensor to uncalibrated ones across various combinations of domain gaps, including marker presence, illumination condition, and elastomer modulus. Experimental results show the effectiveness of the proposed unsupervised force calibration method, with the lowest force prediction errors of 0.102 N (3.4% of the full force range) for normal force, and 0.095 N (6.3%) and 0.062 N (4.1%) for shear forces along the x-axis and y-axis, respectively. This study presents a promising, general force calibration methodology for optical tactile sensors.
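As a quick sanity check (ours, not from the paper), each reported absolute error paired with its percentage implies a full-scale force range for that axis:

```python
# Implied full-scale range = absolute error / fractional error.
# Values are the errors and percentages quoted in the abstract above.
reported = {
    "normal":  (0.102, 0.034),  # 0.102 N at 3.4% of full range
    "shear_x": (0.095, 0.063),  # 0.095 N at 6.3%
    "shear_y": (0.062, 0.041),  # 0.062 N at 4.1%
}
implied_range = {k: err / frac for k, (err, frac) in reported.items()}
```

The two shear axes imply nearly the same full-scale range (about 1.5 N), while the normal axis spans roughly 3 N, so the quoted numbers are internally consistent.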


Robust Anthropomorphic Robotic Manipulation through Biomimetic Distributed Compliance

Junge, Kai, Hughes, Josie

arXiv.org Artificial Intelligence

The impressive ability of humans to perform manipulation robustly relies on compliant interactions, enabled through the structure and materials spatially distributed in our hands. We propose that by mimicking this distributed compliance in an anthropomorphic robotic hand, open-loop manipulation robustness increases, and we observe the emergence of human-like behaviours. To achieve this, we introduce the ADAPT Hand, equipped with tunable compliance throughout the skin, fingers, and wrist. Through extensive automated pick-and-place tests, we show that the grasping robustness closely mirrors an estimated geometric theoretical limit, while `stress-testing' the robot hand with 800+ grasps. Finally, 24 items with widely varying geometries are grasped in a constrained environment with a success rate of 93%. We demonstrate that a hand-object self-organization behavior underlies this extreme robustness, whereby the hand automatically exhibits different grasp types depending on object geometry. Furthermore, the robot's grasp type mimics a natural human grasp with a direct similarity of 68%.


FingerTac -- An Interchangeable and Wearable Tactile Sensor for the Fingertips of Human and Robot Hands

Sathe, Prathamesh, Schmitz, Alexander, Funabashi, Satoshi, Tomo, Tito Pradhono, Somlor, Sophon, Sugano, Shigeki

arXiv.org Artificial Intelligence

Skill transfer from humans to robots is challenging. Presently, many researchers focus on capturing only position or joint-angle data from humans to teach robots. Even though this approach has yielded impressive results for grasping applications, reconstructing motion for object handling or fine manipulation from a human hand to a robot hand has been sparsely explored. Humans use tactile feedback to adjust their motion to various objects, but capturing and reproducing the applied forces is an open research question. In this paper we introduce a wearable fingertip tactile sensor, which captures the distributed 3-axis force vectors on the fingertip. The fingertip tactile sensor is interchangeable between the human hand and the robot hand, meaning that it can also be assembled to fit on a robot hand such as the Allegro hand. This paper presents the structural aspects of the sensor as well as the methodology and approach used to design, manufacture, and calibrate the sensor. The sensor is able to measure forces accurately, with a mean absolute error of 0.21 N, 0.16 N, and 0.44 N in the X, Y, and Z directions, respectively.
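The error metric quoted above is a per-axis mean absolute error against a reference force measurement. A generic sketch on synthetic data (the noise levels are arbitrary assumptions, not the sensor's characteristics):

```python
import numpy as np

# Per-axis mean absolute error between predicted and reference 3-axis
# forces. The "measurements" here are synthetic: ground truth plus
# Gaussian noise whose per-axis spread is chosen arbitrarily.
def mae_per_axis(pred, ref):
    return np.mean(np.abs(pred - ref), axis=0)

rng = np.random.default_rng(1)
ref = rng.uniform(-5.0, 5.0, size=(1000, 3))   # reference XYZ forces (N)
noise_sd = np.array([0.2, 0.3, 0.5])           # invented per-axis noise (N)
pred = ref + rng.normal(0.0, noise_sd, size=(1000, 3))
mae = mae_per_axis(pred, ref)
```

Reporting the error per axis, as the paper does, is informative because shear (X, Y) and normal (Z) sensing typically rely on different deformation modes and need not be equally accurate.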


Touch, press and stroke: a soft capacitive sensor skin

Sarwar, Mirza S., Ishizaki, Ryusuke, Morton, Kieran, Preston, Claire, Nguyen, Tan, Fan, Xu, Dupont, Bertille, Hogarth, Leanna, Yoshiike, Takahide, Mirabbasi, Shahriar, Madden, John D. W.

arXiv.org Artificial Intelligence

Soft sensors that can discriminate shear and normal force could help provide machines the fine control desirable for safe and effective physical interactions with people. A capacitive sensor is made for this purpose, composed of patterned elastomer and containing both fixed and sliding pillars that allow the sensor to deform and buckle, much like skin itself. The sensor differentiates between simultaneously applied pressure and shear. In addition, finger proximity is detectable up to 15 mm, with a pressure and shear sensitivity of 1 kPa and a displacement resolution of 50 µm. The operation is demonstrated on a simple gripper holding a cup. The combination of features and the straightforward fabrication method make this sensor a candidate for implementation as a sensing skin for humanoid robotics applications.
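A parallel-plate toy model (our assumption; the actual pillar dielectric geometry is more involved) shows why pressing registers capacitively: C = ε₀ε_r A/d, so compressing the gap d raises C, while shear shifts the electrode overlap A.

```python
# Parallel-plate capacitance toy model. Pad area, pillar gap, and
# relative permittivity are illustrative values, not the paper's.
EPS0 = 8.854e-12                     # vacuum permittivity (F/m)

def capacitance(area_m2, gap_m, eps_r=3.0):
    return EPS0 * eps_r * area_m2 / gap_m

c_rest = capacitance(1e-6, 100e-6)   # 1 mm^2 pad over a 100 um pillar gap
c_pressed = capacitance(1e-6, 80e-6) # gap compressed by 20 um under pressure
```

Because normal pressure changes d and shear changes A, a pillar array read out differentially can separate the two stimuli, which is the discrimination the sensor above is built around.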